Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening
Authors
Abstract
Electroencephalography (EEG)-based emotion classification during music listening has gained increasing attention due to its promise in applications such as musical affective brain-computer interfaces (ABCI), neuromarketing, music therapy, and implicit multimedia tagging and triggering. However, music is an ecologically valid and complex stimulus that conveys emotions to listeners through compositions of musical elements, and distinguishing emotions from EEG signals alone remains challenging. This study assessed the applicability of a multimodal approach that leverages both EEG dynamics and the acoustic characteristics of musical content to classify emotional valence and arousal. To this end, machine-learning methods were adopted to systematically elucidate the roles of the EEG and music modalities in emotion modeling. The empirical results suggested that when whole-head EEG signals were available, including musical content did not improve classification performance: the accuracy of 74-76% obtained with the EEG modality alone was statistically comparable to that of the multimodal approach. However, when EEG dynamics were available from only a small set of electrodes (the likely case in real-life applications), the music modality played a complementary role, augmenting the EEG results from around 61% to 67% in valence classification and from around 58% to 67% in arousal classification. Musical timbre appeared to replace less-discriminative EEG features and improved both valence and arousal classification, whereas musical loudness contributed specifically to arousal classification. The present study not only provides principles for constructing an EEG-based multimodal approach, but also reveals fundamental insights into the interplay of brain activity and musical content in emotion modeling.
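The abstract does not specify the fusion scheme or classifier, but the multimodal idea it describes can be illustrated with a minimal feature-level (early) fusion sketch: per-trial EEG features (e.g. band powers) are concatenated with music features (e.g. timbre, loudness descriptors) before classification. All feature values, dimensions, and the nearest-centroid classifier below are hypothetical stand-ins, not the paper's actual pipeline.

```python
import numpy as np

def fuse_features(eeg_feats, music_feats):
    """Early (feature-level) fusion: concatenate per-trial EEG and
    music feature vectors into one joint representation."""
    return np.concatenate([eeg_feats, music_feats], axis=1)

# Hypothetical data: 6 trials, 4 EEG band-power features, 2 music features
rng = np.random.default_rng(0)
eeg = rng.normal(size=(6, 4))
music = rng.normal(size=(6, 2))
labels = np.array([0, 0, 0, 1, 1, 1])  # e.g. low vs. high valence

X = fuse_features(eeg, music)  # shape (6, 6): joint EEG+music features

# A nearest-centroid classifier stands in for the (unspecified) learner:
# each class is represented by the mean of its training feature vectors.
centroids = np.stack([X[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    """Assign the class whose centroid is closest in Euclidean distance."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

preds = [predict(x) for x in X]
```

Late (decision-level) fusion, by contrast, would train separate EEG and music classifiers and combine their outputs; the abstract's finding that music features "replace" weak EEG features is most naturally read in the early-fusion setting sketched here.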
Similar Articles
Fusion Framework for Emotional Electrocardiogram and Galvanic Skin Response Recognition: Applying Wavelet Transform
Introduction: To extract and combine information from different modalities, fusion techniques are commonly applied to promote system performance. In this study, we aimed to examine the effectiveness of fusion techniques in emotion recognition. Materials and Methods: Electrocardiogram (ECG) and galvanic skin responses (GSR) of 11 healthy female students (mean age: 22.73±1.68 years) were collected ...
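The blurb mentions applying a wavelet transform to physiological signals. As an illustration only (not the study's actual method, which is truncated here), a single level of the Haar discrete wavelet transform can be computed in pure Python, with band energies serving as typical wavelet-domain features; the sample segment values are made up.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))  # low-pass (average) band
        detail.append((a - b) / math.sqrt(2))  # high-pass (difference) band
    return approx, detail

def band_energy(coeffs):
    """Energy of a coefficient band -- a common wavelet-domain feature."""
    return sum(c * c for c in coeffs)

# Toy 8-sample "physiological" segment (hypothetical values)
segment = [1.0, 1.0, 2.0, 2.0, 3.0, 3.0, 4.0, 4.0]
approx, detail = haar_dwt(segment)
features = [band_energy(approx), band_energy(detail)]
```

Because the Haar transform is orthonormal, the total energy of the two bands equals the energy of the input segment; real studies typically use deeper decompositions and richer wavelets (e.g. Daubechies families via a library such as PyWavelets).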
Full Text
Neural Correlates of Boredom in Music Perception
Introduction: Music can elicit powerful emotional responses, the neural correlates of which have not been properly understood. An important aspect of the quality of any musical piece is its ability to elicit a sense of excitement in the listeners. In this study, we investigated the neural correlates of boredom evoked by music in human subjects. Methods: We used EEG recording in nine sub...
Full Text
EEG-based emotion perception during music listening
In the present study, correlations between electroencephalographic (EEG) activity and emotional responses during music listening were investigated. Carefully selected musical excerpts of classical music tested in previous studies were employed as stimuli. During the experiments, EEG activity was recorded in different regions without a priori defining regions of interest. The analysis of the data ...
Full Text
Investigating the EEG-derived effective brain network of humans during music listening for emotion recognition
In the current research brain effective networks related to happy and sad emotions are studied during listening to music. Connectivity patterns among different EEG channels were extracted using multivariate autoregressive modeling and partial directed coherence while participants listened to musical excerpts. Both classical and Iranian musical selections were used as stimulus. Participants’ se...
Full Text
Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias
Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality may tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence the perception of emotional stimuli presented in other sensory modalities, through a top-down ...
Full Text